
    UltrARsound: in situ visualization of live ultrasound images using HoloLens 2

    Purpose: Augmented reality (AR) has the potential to simplify ultrasound (US) examinations, which usually require a skilled and experienced sonographer to mentally align narrow 2D cross-sectional US images within the 3D anatomy of the patient. This work describes and evaluates a novel approach that tracks retroreflective spheres attached to the US probe using an inside-out technique with the AR glasses HoloLens 2, so that live US images can be displayed in situ on the imaged anatomy. Methods: The Unity application UltrARsound performs spatial tracking of the US probe and the attached retroreflective markers using the depth camera integrated into the AR glasses, eliminating the need for an external tracking system. Additionally, a Kalman filter is implemented to compensate for the noisy measurements of the camera. US images are streamed wirelessly via the PLUS toolkit to HoloLens 2. The technical evaluation comprises static and dynamic tracking accuracy, as well as the frequency and latency of the displayed images. Results: With the Kalman filter, tracking is performed with a median accuracy of 1.98 mm/1.81° in the static setting; in a dynamic scenario, the median error was 2.81 mm/1.70°. The tracking frequency is currently limited to 20 Hz, and 83% of the displayed US images had a latency below 16 ms. Open Access funding enabled and organized by Projekt DEAL. This work was supported by a fellowship within the IFI programme of the German Academic Exchange Service (DAAD).
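
    The abstract does not include the filter implementation; the sketch below shows, purely as an illustration, how a constant-velocity Kalman filter can smooth noisy 3D marker positions from a depth camera. It is written in Python rather than the Unity/C# used by UltrARsound, and the class name and noise magnitudes are assumptions, not values taken from the paper; only the 20 Hz step size reflects the reported tracking rate.

```python
import numpy as np

class MarkerKalmanFilter:
    """Constant-velocity Kalman filter for one tracked marker position.

    State: [x, y, z, vx, vy, vz]; measurements are noisy 3D positions
    from the depth camera. Noise magnitudes are illustrative guesses,
    not values from the paper.
    """

    def __init__(self, dt=0.05, process_noise=1e-2, measurement_noise=2.0):
        # dt = 0.05 s matches the reported 20 Hz tracking frequency.
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)                    # x' = x + v * dt
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only
        self.Q = process_noise * np.eye(6)                 # process noise covariance
        self.R = measurement_noise * np.eye(3)             # measurement noise (mm^2)
        self.x = np.zeros(6)                               # state estimate
        self.P = np.eye(6) * 1e3                           # large initial uncertainty

    def update(self, z):
        """Fuse one position measurement z (3-vector, mm); return the filtered position."""
        # Predict step
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update step
        y = z - self.H @ self.x                            # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]
```

    Filtering each marker independently like this trades a small amount of lag for stability; the accuracy and latency figures above apply to the paper's own implementation, not to this sketch.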

    Combining Surgical Navigation and 3D Printing for Less Invasive Pelvic Tumor Resections

    Surgical interventions for musculoskeletal tumor resection are particularly challenging in the pelvic region due to its anatomical complexity and the proximity of vital structures. Several techniques, such as surgical navigation and patient-specific instruments (PSIs), have been introduced to ensure accurate resection margins. However, their inclusion usually modifies the surgical approach, making it more invasive. In this study, we propose combining both techniques to reduce this invasiveness while improving the precision of the intervention. PSIs are used for image-to-patient registration and for installing the navigation's reference frame. We tested and validated the proposed setup in a realistic surgical scenario with six cadavers (12 hemipelves). The data collected during the experiment allowed us to study different resection scenarios and identify the patient-specific instrument configurations that optimize navigation accuracy. The mean values obtained for the maximum osteotomy deviation, or MOD (the maximum distance between the planned and actual osteotomy for each simulated scenario), were as follows: for ilium resections, 5.9 mm in the iliac crest and 1.65 mm in the supra-acetabular region; for acetabulum resections, 3.44 mm, 1.88 mm, and 1.97 mm in the supra-acetabular, ischial, and pubic regions, respectively. Additionally, cases with an image-to-patient registration error below 2 mm ensured MODs of 2 mm or lower. Our results show how combining several PSIs leads to low navigation errors and high precision while providing a less invasive surgical approach. This work was supported by the Ministerio de Ciencia e Innovación, Instituto de Salud Carlos III, and European Regional Development Fund "Una manera de hacer Europa," under Project PI18/01625.
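
    To make the MOD metric concrete, here is a minimal sketch under the assumption that the planned osteotomy is modeled as a plane and the executed cut is sampled as points, so that MOD becomes the largest point-to-plane distance. The function name and example data are hypothetical; the paper does not specify how the distances were computed.

```python
import numpy as np

def maximum_osteotomy_deviation(planned_point, planned_normal, actual_points):
    """Illustrative MOD: largest unsigned distance from points sampled on the
    executed cut surface to the planned osteotomy plane (all units in mm).

    planned_point  -- any point on the planned plane, shape (3,)
    planned_normal -- plane normal, shape (3,)
    actual_points  -- points on the executed osteotomy, shape (N, 3)
    """
    n = planned_normal / np.linalg.norm(planned_normal)
    distances = np.abs((actual_points - planned_point) @ n)
    return distances.max()

# Hypothetical example: a cut tilted slightly off the planned plane.
actual = np.array([[0.0, 0.0, 1.2], [10.0, 0.0, 2.5], [0.0, 10.0, 0.4]])
print(maximum_osteotomy_deviation(np.zeros(3), np.array([0.0, 0.0, 1.0]), actual))
# -> 2.5 (mm)
```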

    Augmented reality as a tool to guide PSI placement in pelvic tumor resections

    Patient-specific instruments (PSIs) have become a valuable tool for osteotomy guidance in complex surgical scenarios such as pelvic tumor resection. They provide accuracy similar to surgical navigation systems but are generally more convenient and faster. However, their correct placement can become challenging in some anatomical regions, and it cannot be verified objectively during the intervention. Incorrect installation can result in large deviations from the planned osteotomy, increasing the risk of positive resection margins. In this work, we propose using augmented reality (AR) to guide and verify PSI placement. We designed an experiment to assess the accuracy provided by the system using a smartphone and the HoloLens 2, and compared the results with the conventional freehand method. The results showed significant differences: AR guidance prevented large osteotomy deviations, reducing the maximal deviation from 54.03 mm for freehand placement to less than 5 mm. The experiment was performed on two versions of a plastic three-dimensional (3D) printed phantom, one including a silicone layer to simulate tissue and provide more realism. We also studied how differences in the shape and location of PSIs affect their accuracy, concluding that those with smaller sizes and a homogeneous target surface are more prone to errors. Our study presents promising results that demonstrate AR's potential to overcome the present limitations of PSIs conveniently and effectively. This research was funded by project PI18/01625 (Ministerio de Ciencia e Innovación, Instituto de Salud Carlos III and European Regional Development Fund "Una manera de hacer Europa").
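
    The abstract reports placement deviations but not how placement error is scored. One common way to quantify the difference between a planned and an achieved PSI pose, sketched here as an assumption rather than the authors' method, is to compare the two poses as rigid transforms and report the translation and rotation components separately.

```python
import numpy as np

def psi_placement_error(T_planned, T_actual):
    """Translation (mm) and rotation (degrees) error between the planned and
    achieved PSI poses, given as 4x4 homogeneous transforms in patient space.
    This is one standard pose-error metric, not the paper's documented method.
    """
    translation_error = np.linalg.norm(T_actual[:3, 3] - T_planned[:3, 3])
    R_delta = T_planned[:3, :3].T @ T_actual[:3, :3]      # relative rotation
    # Rotation angle from the trace of the relative rotation, clipped for safety.
    cos_angle = np.clip((np.trace(R_delta) - 1.0) / 2.0, -1.0, 1.0)
    rotation_error = np.degrees(np.arccos(cos_angle))
    return translation_error, rotation_error
```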

    Augmented reality in computer-assisted interventions based on patient-specific 3D printed reference

    Augmented reality (AR) is a promising technology for clinical scenarios as an alternative to conventional surgical navigation. However, the registration between augmented data and real-world spaces is a limiting factor. In this study, the authors propose a method based on desktop three-dimensional (3D) printing to create patient-specific tools containing a visual pattern that enables automatic registration. This tool fits on the patient only in the location it was designed for, avoiding placement errors. The solution has been developed as a software application running on Microsoft HoloLens. The workflow was validated on a 3D printed phantom replicating the anatomy of a patient presenting with an extraosseous Ewing's sarcoma, and then tested during the actual surgical intervention. The application allowed physicians to visualise the skin, bone and tumour location overlaid on the phantom and the patient. This workflow could be extended to many clinical applications in the surgical field, as well as to training and simulation, in cases where hard body structures are involved. Although the authors have tested their workflow on an AR head-mounted display, they believe that a similar approach can be applied to other devices such as tablets or smartphones.
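
    The abstract does not detail the registration computation. Given corresponding 3D points, such as visual-pattern features detected by the headset and their known coordinates in the planning image, image-to-patient registration reduces to a standard least-squares rigid alignment (the Kabsch algorithm), sketched below; the function name and usage are illustrative, not taken from the paper.

```python
import numpy as np

def rigid_registration(source, target):
    """Least-squares rigid transform (Kabsch algorithm) mapping source points
    onto target points, both shape (N, 3) with row-wise correspondence.
    Returns rotation R (3x3) and translation t (3,) such that
    target ≈ source @ R.T + t.
    """
    src_c = source - source.mean(axis=0)                  # center both point sets
    tgt_c = target - target.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ tgt_c)             # cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))                # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = target.mean(axis=0) - R @ source.mean(axis=0)
    return R, t
```

    With at least three non-collinear correspondences this yields a unique rigid transform, which is what lets a single printed pattern anchor the whole augmented scene to the patient.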